
    On stepdown control of the false discovery proportion

    Consider the problem of testing multiple null hypotheses. A classical approach to dealing with the multiplicity problem is to restrict attention to procedures that control the familywise error rate (FWER), the probability of even one false rejection. However, if the number of hypotheses s is large, control of the FWER is so stringent that the ability of a procedure which controls the FWER to detect false null hypotheses is limited. Consequently, it is desirable to consider other measures of error control. We will consider methods based on control of the false discovery proportion (FDP), defined as the number of false rejections divided by the total number of rejections (defined to be 0 if there are no rejections). The false discovery rate proposed by Benjamini and Hochberg (1995) controls E(FDP). Here, we construct methods such that, for any γ and α, P{FDP > γ} ≤ α. Based on p-values of individual tests, we consider stepdown procedures that control the FDP without imposing dependence assumptions on the joint distribution of the p-values. A greatly improved version of a method given in Lehmann and Romano (2005) is derived and generalized to provide a means by which any sequence of nondecreasing constants can be rescaled to ensure control of the FDP. We also provide a stepdown procedure that controls the FDR under a dependence assumption. Published at http://dx.doi.org/10.1214/074921706000000383 in the IMS Lecture Notes--Monograph Series (http://www.imstat.org/publications/lecnotes.htm) by the Institute of Mathematical Statistics (http://www.imstat.org).
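
    For readers who want a concrete picture of how a stepdown rule of this kind operates, here is a minimal Python sketch that applies the Lehmann and Romano (2005) critical constants to sorted p-values; it illustrates the mechanics only and does not reproduce the improved, rescaled constants derived in this paper. The function name and example values are hypothetical.

        import numpy as np

        def stepdown_fdp(pvalues, gamma, alpha):
            # Illustrative critical constants of Lehmann and Romano (2005):
            # alpha_i = (floor(gamma*i) + 1) * alpha / (s + floor(gamma*i) + 1 - i)
            p = np.asarray(pvalues, dtype=float)
            s = p.size
            i = np.arange(1, s + 1)
            crit = (np.floor(gamma * i) + 1) * alpha / (s + np.floor(gamma * i) + 1 - i)

            order = np.argsort(p)                 # hypotheses sorted by p-value
            below = p[order] <= crit
            # Step down: keep rejecting until the first ordered p-value exceeds
            # its critical constant; everything before that point is rejected.
            k = int(np.argmax(~below)) if not below.all() else s
            reject = np.zeros(s, dtype=bool)
            reject[order[:k]] = True
            return reject

        # Toy example: control P{FDP > 0.1} <= 0.05 over ten hypotheses.
        print(stepdown_fdp([0.001, 0.002, 0.01, 0.03, 0.2, 0.4, 0.5, 0.6, 0.7, 0.9],
                           gamma=0.1, alpha=0.05))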

    Stepup procedures for control of generalizations of the familywise error rate

    Consider the multiple testing problem of testing null hypotheses H_1, ..., H_s. A classical approach to dealing with the multiplicity problem is to restrict attention to procedures that control the familywise error rate (FWER), the probability of even one false rejection. But if s is large, control of the FWER is so stringent that the ability of a procedure that controls the FWER to detect false null hypotheses is limited. It is therefore desirable to consider other measures of error control. This article considers two generalizations of the FWER. The first is the k-FWER, in which one is willing to tolerate k or more false rejections for some fixed k ≥ 1. The second is based on the false discovery proportion (FDP), defined to be the number of false rejections divided by the total number of rejections (and defined to be 0 if there are no rejections). Benjamini and Hochberg [J. Roy. Statist. Soc. Ser. B 57 (1995) 289--300] proposed control of the false discovery rate (FDR), by which they meant that, for fixed α, E(FDP) ≤ α. Here, we consider control of the FDP in the sense that, for fixed γ and α, P{FDP > γ} ≤ α. Beginning with any nondecreasing sequence of constants and p-values for the individual tests, we derive stepup procedures that control each of these two measures of error control without imposing any assumptions on the dependence structure of the p-values. We use our results to point out a few interesting connections with some closely related stepdown procedures. We then compare and contrast two FDP-controlling procedures obtained using our results with the stepup procedure for control of the FDR of Benjamini and Yekutieli [Ann. Statist. 29 (2001) 1165--1188]. Published at http://dx.doi.org/10.1214/009053606000000461 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
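
    To make the contrast with stepdown procedures concrete, the following sketch implements a generic stepup rule driven by an arbitrary nondecreasing constant sequence; the Benjamini-Hochberg constants appear only as a familiar example, not as the k-FWER- or FDP-controlling constants derived in this article. Names and values are illustrative.

        import numpy as np

        def stepup(pvalues, constants):
            # Stepup rule: find the largest index r with p_(r) <= constants[r]
            # (1-based) and reject the hypotheses with the r smallest p-values.
            p = np.asarray(pvalues, dtype=float)
            c = np.asarray(constants, dtype=float)
            order = np.argsort(p)
            hits = np.nonzero(p[order] <= c)[0]
            reject = np.zeros(p.size, dtype=bool)
            if hits.size:
                reject[order[:hits[-1] + 1]] = True
            return reject

        # Benjamini-Hochberg constants, used here only as a familiar example of
        # a nondecreasing stepup sequence.
        s, alpha = 10, 0.05
        bh_constants = np.arange(1, s + 1) * alpha / s
        print(stepup([0.001, 0.002, 0.01, 0.03, 0.2, 0.4, 0.5, 0.6, 0.7, 0.9],
                     bh_constants))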

    On the uniform asymptotic validity of subsampling and the bootstrap

    This paper provides conditions under which subsampling and the bootstrap can be used to construct estimators of the quantiles of the distribution of a root that behave well uniformly over a large class of distributions P. These results are then applied (i) to construct confidence regions that behave well uniformly over P in the sense that the coverage probability tends to at least the nominal level uniformly over P, and (ii) to construct tests that behave well uniformly over P in the sense that the size tends to no greater than the nominal level uniformly over P. Without these stronger notions of convergence, the asymptotic approximations to the coverage probability or size may be poor, even in very large samples. Specific applications include the multivariate mean, testing moment inequalities, multiple testing, the empirical process and U-statistics. Published at http://dx.doi.org/10.1214/12-AOS1051 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
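
    As background on the kind of quantile estimator being studied, the sketch below shows the textbook bootstrap construction in the simplest case, a scalar mean: the distribution of the root sqrt(n)(mean(x) − μ) is approximated by resampling, and its quantiles are inverted to form a confidence interval. It illustrates the object of study, not the uniformity conditions the paper establishes; the function name and defaults are hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)

        def bootstrap_ci_mean(x, level=0.95, n_boot=2000):
            # Approximate the distribution of the root sqrt(n)*(mean - mu) by
            # resampling, then invert its quantiles to get a confidence interval.
            x = np.asarray(x, dtype=float)
            n = x.size
            theta_hat = x.mean()
            roots = np.empty(n_boot)
            for b in range(n_boot):
                xb = rng.choice(x, size=n, replace=True)
                roots[b] = np.sqrt(n) * (xb.mean() - theta_hat)
            lo, hi = np.quantile(roots, [(1 - level) / 2, (1 + level) / 2])
            return theta_hat - hi / np.sqrt(n), theta_hat - lo / np.sqrt(n)

        # Usage on a synthetic sample of size 100.
        print(bootstrap_ci_mean(rng.normal(size=100)))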

    The AirWand: Design and Characterization of a Large-Workspace Haptic Device

    Almost all commercially available haptic interfaces share a common pitfall: a small, shoebox-sized workspace. These devices typically rely on rigid-link manipulator design concepts. In this paper we outline our design for a new kinesthetic haptic system that drastically increases the usable haptic workspace. We present a proof-of-concept prototype, along with our analysis of its capabilities. Our design uses optical tracking to sense the position of the device and air jet actuation to generate forces. By combining these two technologies, we are able to detach our device from the ground, thus sidestepping many problems that have plagued traditional haptic devices, including workspace size, friction, and inertia. We show that optical tracking and air jet actuation successfully enable kinesthetic haptic interaction with virtual environments. Given an appropriately large-volume, high-pressure air source and a reasonably high-speed tracking system, this design paradigm has many desirable qualities when compared to traditional haptic design schemes.

    Automatic Filter Design for Synthesis of Haptic Textures from Recorded Acceleration Data

    Sliding a probe over a textured surface generates a rich collection of vibrations that one can easily use to create a mental model of the surface. Haptic virtual environments attempt to mimic these real interactions, but common haptic rendering techniques typically fail to reproduce the sensations that are encountered during texture exploration. Past approaches have focused on building a representation of textures using a priori ideas about surface properties. Instead, this paper describes a process of synthesizing probe-surface interactions from data recorded from real interactions. We explain how to apply the mathematical principles of Linear Predictive Coding (LPC) to develop a discrete transfer function that represents the acceleration response under specific probe-surface interaction conditions. We then use this predictive transfer function to generate unique acceleration signals of arbitrary length. In order to move between transfer functions from different probe-surface interaction conditions, we develop a method for interpolating the variables involved in the texture synthesis process. Finally, we compare the results of this process with real recorded acceleration signals, and we show that the two correlate strongly in the frequency domain
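
    For intuition about how such data-driven synthesis works in general, the sketch below fits an all-pole (LPC) filter to a recorded acceleration trace with the autocorrelation method and then drives it with white noise of matched power to generate a new signal of arbitrary length. The function names, model order, and file name are hypothetical, and the paper's specific filter-design and interpolation steps are not reproduced.

        import numpy as np
        from scipy.linalg import solve_toeplitz
        from scipy.signal import lfilter

        def lpc_fit(signal, order):
            # Autocorrelation-method LPC: solve the Yule-Walker equations for the
            # all-pole coefficients a and return the prediction-error variance.
            x = np.asarray(signal, dtype=float)
            r = np.correlate(x, x, mode="full")[x.size - 1:][:order + 1]
            a = solve_toeplitz((r[:-1], r[:-1]), r[1:])
            var = r[0] - a @ r[1:]
            return a, var

        def lpc_synthesize(a, var, n_samples, rng=np.random.default_rng(0)):
            # Drive the all-pole filter 1/A(z) with white noise of matched power.
            excitation = rng.normal(scale=np.sqrt(var), size=n_samples)
            return lfilter([1.0], np.concatenate(([1.0], -a)), excitation)

        # Usage: fit a recorded acceleration trace, then synthesize a new signal.
        # recorded = np.load("acceleration.npy")     # hypothetical recording
        # a, var = lpc_fit(recorded, order=30)
        # synthetic = lpc_synthesize(a, var, n_samples=10_000)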

    Optimal testing of multiple hypotheses with common effect direction

    We present a theoretical basis for testing related endpoints. Typically, it is known how to construct tests of the individual hypotheses, but not how to combine them into a multiple test procedure that controls the familywise error rate. Using the closure method, we emphasize the role of consonant procedures, from an interpretive as well as a theoretical viewpoint. Surprisingly, even if each intersection test has an optimality property, the overall procedure obtained by applying closure to these tests may be inadmissible. We introduce a new procedure, which is consonant and has a maximin property under the normal model. The results are then applied to PROactive, a clinical trial designed to investigate the effectiveness of a glucose-lowering drug on macrovascular outcomes among patients with type 2 diabetes.
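
    To make the closure method concrete, the sketch below enumerates every intersection hypothesis and rejects an individual hypothesis only when all intersections containing it are rejected. The Bonferroni intersection test used in the example is a placeholder, not the maximin intersection test constructed in the paper, and the helper names are hypothetical.

        from itertools import combinations

        def closed_testing(s, intersection_test):
            # Closure method: reject elementary H_i iff every intersection
            # hypothesis containing i is rejected by its own level-alpha test.
            # `intersection_test` maps a tuple of indices to True (reject)/False.
            # Exponential in s, so intended only to illustrate the principle.
            rejected = set()
            for k in range(1, s + 1):
                for idx in combinations(range(s), k):
                    if intersection_test(idx):
                        rejected.add(idx)
            decisions = []
            for i in range(s):
                containing_i = [idx for k in range(1, s + 1)
                                for idx in combinations(range(s), k) if i in idx]
                decisions.append(all(idx in rejected for idx in containing_i))
            return decisions

        # Example with a toy Bonferroni intersection test on fixed p-values.
        pvals, alpha = [0.001, 0.02, 0.6], 0.05
        print(closed_testing(3, lambda idx: min(pvals[i] for i in idx) <= alpha / len(idx)))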

    Dimensional Reduction of High-Frequency Accelerations for Haptic Rendering

    Haptics research has seen several recent efforts at understanding and recreating real vibrations to improve the quality of haptic feedback in both virtual environments and teleoperation. To simplify the modeling process and enable the use of single-axis actuators, these previous efforts have used just one axis of a three-dimensional vibration signal, even though the main vibration mechanoreceptors in the hand are known to detect vibrations in all directions. Furthermore, the fact that these mechanoreceptors are largely insensitive to the direction of high-frequency vibrations points to the existence of a transformation that can reduce three-dimensional high-frequency vibration signals to a one-dimensional signal without appreciable perceptual degradation. After formalizing the requirements for this transformation, this paper describes and compares several candidate methods of varying degrees of sophistication, culminating in a novel frequency-domain solution that performs very well on our chosen metrics.
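
    As a rough illustration of a frequency-domain reduction of this kind, the sketch below combines the per-axis spectral magnitudes as a root sum of squares (preserving spectral energy) and borrows the phase of the summed spectrum so the result is a real one-dimensional signal. This is a simplified sketch of the general idea, not necessarily the method developed in the paper; the function name is hypothetical.

        import numpy as np

        def reduce_vibration_3d_to_1d(accel_xyz):
            # accel_xyz has shape (n_samples, 3).  Combine magnitudes across the
            # three axes as a root sum of squares and reuse the phase of the
            # summed spectrum so the inverse FFT yields a real signal.
            a = np.asarray(accel_xyz, dtype=float)
            spectra = np.fft.rfft(a, axis=0)
            magnitude = np.sqrt((np.abs(spectra) ** 2).sum(axis=1))
            phase = np.angle(spectra.sum(axis=1))
            return np.fft.irfft(magnitude * np.exp(1j * phase), n=a.shape[0])

        # Usage with a synthetic 3-axis recording.
        rng = np.random.default_rng(0)
        print(reduce_vibration_3d_to_1d(rng.normal(size=(1024, 3))).shape)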

    High Frequency Acceleration Feedback Significantly Increases the Realism of Haptically Rendered Textured Surfaces

    Almost every physical interaction generates high frequency vibrations, especially if one of the objects is a rigid tool. Previous haptics research has hinted that the inclusion or exclusion of these signals plays a key role in the realism of haptically rendered surface textures, but this connection has not been formally investigated until now. This paper presents a human subject study that compares the performance of a variety of surface rendering algorithms for a master-slave teleoperation system; each controller provides the user with a different combination of position and acceleration feedback, and subjects compared the renderings with direct tool-mediated exploration of the real surface. We use analysis of variance to examine quantitative performance metrics and qualitative realism ratings across subjects. The results of this study show that algorithms that include high-frequency acceleration feedback in combination with position feedback achieve significantly higher realism ratings than traditional position feedback alone. Furthermore, we present a frequency-domain metric for quantifying a controller's acceleration feedback performance; given a constant surface stiffness, the median of this metric across subjects was found to have a significant positive correlation with median realism rating.